Ollama 0.9.0
New models
- DeepSeek-R1-0528: DeepSeek-R1 has received a minor version upgrade to DeepSeek-R1-0528 for both its 8-billion-parameter distilled model and the full 671-billion-parameter model. This update brings significant improvements to DeepSeek R1's inference and reasoning capabilities.
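For example, the updated weights can be pulled with the usual commands. The size tags below are assumptions based on the parameter counts above; check the deepseek-r1 page on ollama.com for the exact tags available.

```shell
# Pull the updated 8B distilled model (tag assumed from the 8B size above)
ollama pull deepseek-r1:8b

# Pull the full 671B model (a very large download requiring substantial disk and memory)
ollama pull deepseek-r1:671b
```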
Thinking
Ollama can now enable and disable a model's thinking. This gives users the flexibility to choose whether a model should think, depending on the application and use case.
When thinking is enabled, the model's thinking is separated from its output. When thinking is disabled, the model does not think and outputs its content directly.
Models that support thinking:
- DeepSeek R1
- Qwen 3
- more will be added under thinking models
When running a model that supports thinking, Ollama will now display the model's thoughts:
% ollama run deepseek-r1
>>> How many Rs are in strawberry
Thinking...
First, I need to understand what the question is asking. It's asking how many letters 'R' are present in the word "strawberry."
Next, I'll examine each letter in the word individually.
I'll start from the beginning and count every occurrence of the letter 'R.'
After reviewing all the letters, I determine that there are three instances where the letter 'R' appears in the word "strawberry."
...done thinking.
There are three **Rs** in the word **"strawberry"**.
In Ollama's API, a model's thinking is now returned as a separate thinking field for easy parsing:
{
  "message": {
    "role": "assistant",
    "thinking": "First, I need to understand what the question is asking. It's asking how many letters 'R' are present in the word \"strawberry...",
    "content": "There are **3** instances of the letter **R** in the word **\"strawberry.\"**"
  }
}
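Because thinking and content arrive as separate fields, downstream code can consume them independently. The following is a minimal sketch assuming a local server on the default port, the deepseek-r1 model, and that jq is installed; "stream": false is used so the response arrives as a single JSON object.

```shell
# Sketch: request a non-streaming chat response and print the two fields separately.
curl -s http://localhost:11434/api/chat -d '{
  "model": "deepseek-r1",
  "messages": [
    { "role": "user", "content": "How many Rs are in strawberry" }
  ],
  "think": true,
  "stream": false
}' | jq -r '.message.thinking, .message.content'
```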
Turning thinking on and off
In the API, thinking can be enabled by passing "think": true and disabled by passing "think": false:
curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-r1",
  "messages": [
    {
      "role": "user",
      "content": "Why is the sky blue?"
    }
  ],
  "think": true
}'
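Conversely, the same request can be sent with "think": false; the model then answers directly, and the response is expected to carry no separate thinking field (a sketch of the disabled case, not output from an actual run):

```shell
# Same request with thinking disabled; the message should contain
# only "content", with no separate "thinking" field.
curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-r1",
  "messages": [
    { "role": "user", "content": "Why is the sky blue?" }
  ],
  "think": false
}'
```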
In Ollama's CLI, use /set think and /set nothink to enable and disable thinking.
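For example, a CLI session with thinking disabled might look roughly like the following (a sketch; any confirmation text printed by /set nothink and the model's exact wording will differ):

```
% ollama run deepseek-r1
>>> /set nothink
>>> How many Rs are in strawberry
There are three Rs in the word "strawberry".
```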
What's Changed
- Add thinking mode support to Ollama
Full Changelog: https://github.com/ollama/ollama/releases/tag/v0.9.0